85 research outputs found

    PySilSub: An open-source Python toolbox for implementing the method of silent substitution in vision and nonvisual photoreception research

    The normal human retina contains several classes of photosensitive cell—rods for low-light vision, three cone classes for daylight vision, and intrinsically photosensitive retinal ganglion cells (ipRGCs) expressing melanopsin for non-image-forming functions, including pupil control, melatonin suppression, and circadian photoentrainment. The spectral sensitivities of the photoreceptors overlap significantly, which means that most lights will stimulate all photoreceptors to varying degrees. The method of silent substitution is a powerful tool for stimulating individual photoreceptor classes selectively and has found much use in research and clinical settings. The main hardware requirement for silent substitution is a spectrally calibrated light stimulation system with at least as many primaries as there are photoreceptors under consideration. Device settings that will produce lights to selectively stimulate the photoreceptor(s) of interest can be found using a variety of analytic and algorithmic approaches. Here we present PySilSub (https://github.com/PySilentSubstitution/pysilsub), a novel Python package for silent substitution featuring flexible support for individual colorimetric observer models (including human and mouse observers), multiprimary stimulation devices, and solving silent substitution problems with linear algebra and constrained numerical optimization. The toolbox is registered with the Python Package Index and includes example data sets from various multiprimary systems. We hope that PySilSub will facilitate the application of silent substitution in research and clinical settings.
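    The core silent-substitution calculation reduces to linear algebra. Below is a minimal sketch (not the PySilSub API, and with invented numbers) of modulating melanopsin while silencing the three cone classes on a hypothetical five-primary device; in practice the excitation matrix comes from the device's calibrated spectra and the observer's spectral sensitivities, and gamut limits are handled with constrained numerical optimization.

```python
# Illustrative sketch only: values and setup are hypothetical, not PySilSub code.
import numpy as np

# Rows = photoreceptors (S, M, L cones, melanopsin); columns = device primaries.
# Each entry is the excitation produced by a unit input to that primary.
A = np.array([
    [0.9, 0.3, 0.1, 0.0, 0.0],
    [0.2, 0.8, 0.5, 0.1, 0.0],
    [0.1, 0.4, 0.9, 0.3, 0.1],
    [0.3, 0.5, 0.4, 0.8, 0.2],
])

# Desired change in excitation: drive melanopsin, keep the cones silent.
target = np.array([0.0, 0.0, 0.0, 1.0])

# With more primaries than photoreceptors the system is underdetermined;
# the pseudoinverse gives the minimum-norm primary modulation. Respecting
# the device gamut would call for constrained optimization instead.
w = np.linalg.pinv(A) @ target

print("primary modulation:", np.round(w, 3))
print("photoreceptor excitation change:", np.round(A @ w, 3))
```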

    Pulse trains to percepts: The challenge of creating a perceptually intelligible world by electrically stimulating visual cortex


    Visual Cortex: The Continuing Puzzle of Area V2

    Surprisingly little is known about the role of V2 in visual processing. A recent study found that the responses of V2 neurons to pairs of angled lines could be predicted from their responses to the individual line components. A simple analysis shows how these neurons may simply sum the responses of one or more orientation-selective V1 neurons.
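    As an illustration only (hypothetical tuning curves and weights, not taken from the study), the prediction amounts to a linear summation model: the response to a pair of lines is estimated by adding the responses to each line presented alone.

```python
# Hypothetical linear-summation sketch; all parameters are invented.
import numpy as np

def v1_response(orientation_deg, preferred_deg, bandwidth_deg=30.0):
    """Gaussian orientation tuning of a V1 neuron (arbitrary units)."""
    return np.exp(-0.5 * ((orientation_deg - preferred_deg) / bandwidth_deg) ** 2)

# A V2 neuron modeled as a weighted sum of two orientation-selective V1 inputs.
weights = np.array([1.0, 0.6])
preferred = np.array([0.0, 45.0])   # preferred orientations of the V1 inputs

line_a, line_b = 0.0, 45.0          # orientations of the two lines in the pair

# Responses to each line presented alone.
r_a = weights @ v1_response(line_a, preferred)
r_b = weights @ v1_response(line_b, preferred)

# Linear summation predicts the response to the pair from the components.
print(f"response to line A alone:      {r_a:.2f}")
print(f"response to line B alone:      {r_b:.2f}")
print(f"predicted response to the pair: {r_a + r_b:.2f}")
```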

    Dividing attention across opposing features normalizes fMRI responses in visual cortex
